Patent abstract:
An input device, information processing device, input value acquisition method, and non-transitory computer-readable recording medium. The GUI screen image is a standard screen image and displays a first combined GUI area, which is a combination of a directional key GUI and a joystick GUI, and a second combined GUI area, which is a combination of a GUI of four operation buttons and a GUI of a joystick, at the bottom left and bottom right of the screen image, respectively. Depending on which area within the first combined GUI area or the second combined GUI area a user newly touches, it is determined which of the combined GUIs will be used and the screen image is switched; if the finger or thumb is removed, the screen image switches back.
Publication number: BR102012002995B1
Application number: R102012002995-2
Filing date: 2012-02-09
Publication date: 2020-09-15
Inventors: Yoshimitsu Funabashi; Shunsuke Kunieda; Brian Johnson; Jorge Furuya
Applicant: Sony Computer Entertainment Inc.
IPC primary class:
Patent description:

1. Field of the Invention
[001] The present invention relates to an input device that receives an input operation performed by a user's hand, an information processing device, and an input value acquisition method used in those devices.

2. Description of the Related Art
[002] In recent years, compact-sized information devices designed to be carried around, such as handheld game devices, mobile phones, PDAs (Personal Digital Assistants), and the like, have become popular. For these compact-sized devices, data entry means are limited by the size constraint. As a result, data entry means and specialized functions for compact-sized devices have developed in unique ways. For example, covering the surface of a display with a touch panel and allowing data entry with a finger or a touch pen gives a user the feeling of directly manipulating an object or the like shown on the screen.
[003] On the other hand, an environment for performing information processing with these compact-sized devices at a level similar to that of game consoles or personal computers is also being prepared. For example, letting a user operate a compact-sized device while a console device connected to it over a network performs the actual information processing makes it possible to enjoy advanced gameplay regardless of the user's location. In addition, by emulating a game for a console device, it is becoming possible to play it on a compact-sized device.
[004] In this way, a technological trend that allows a device to perform information processing, such as a game or the like, regardless of the size of the device or of the environment in which the device is used, has been seen in recent years. However, when attempting to perform such highly developed information processing on a compact-sized device, the problem of poor operability arises from the limitation of data entry means described above.

SUMMARY OF THE INVENTION
[005] The present invention addresses the aforementioned issue, and a purpose thereof is to provide a technology capable of implementing a data entry means with favorable operability even under size limitations.
[006] In accordance with an embodiment of the present invention, an input device is provided. The input device includes: a GUI (Graphical User Interface) image generating unit operative to generate a GUI image; a display device operative to display the GUI image generated by the GUI image generating unit; a touch panel operative to cover the display device and to detect a position where a user makes contact with the display device; and an operation information converter unit operative to identify an operation performed by the user based on a correspondence relationship between a contact point detected by the touch panel and the GUI image being displayed, wherein the GUI image generating unit provides a combined GUI area in the GUI image, the combined GUI area combining a plurality of GUIs by means of a combined graphic that is a combination of at least parts of the graphics of the plurality of GUIs, and when the user newly makes contact with the combined GUI, the operation information converter unit identifies the GUI corresponding to the graphic that includes the point from which the contact is initiated, said GUI being included in the plurality of GUIs combined by the combined GUI, and the GUI image generating unit allows the plurality of GUIs to share the same detection area on the touch panel by switching the combined GUI to said GUI identified by the operation information converter unit.
[007] In accordance with another embodiment of the present invention, an information processing device is provided. The information processing device includes: a GUI (Graphical User Interface) image generating unit operative to generate a GUI image; an information processing unit operative to perform information processing according to an operation made on the GUI by a user; a display device operative to display the GUI image generated by the GUI image generating unit as an on-screen display over an output image generated as a result of the information processing performed by the information processing device; a touch panel operative to cover the display device and to detect a position where the user makes contact with the display device; and an operation information converter unit operative to identify an operation performed by the user based on a correspondence relationship between a contact point detected by the touch panel and the GUI image being displayed, wherein the GUI image generating unit provides a combined GUI area in the GUI image, the combined GUI area combining a plurality of GUIs by means of a combined graphic that is a combination of at least parts of the graphics of the plurality of GUIs, and when the user newly makes contact with the combined GUI, the operation information converter unit identifies the GUI corresponding to the graphic that includes the point from which the contact is initiated, said GUI being included in the plurality of GUIs combined by the combined GUI, and the GUI image generating unit allows the plurality of GUIs to share the same detection area on the touch panel by switching the combined GUI to said GUI identified by the operation information converter unit.
[008] In accordance with yet another embodiment of the present invention, an input value acquisition method is provided. The input value acquisition method includes: generating a GUI (Graphical User Interface) image; displaying the GUI image on a display device as an on-screen display over an output image generated as a result of information processing; detecting, by a touch panel that covers the display device, a position where a user makes contact; and identifying an operation performed by the user based on a correspondence relationship between a detected contact point and the GUI image being displayed, wherein the generating provides a combined GUI area in the GUI image, the combined GUI area combining a plurality of GUIs by means of a combined graphic that is a combination of at least parts of the graphics of the plurality of GUIs, and when the user newly makes contact with the combined GUI, the identifying identifies the GUI corresponding to the graphic that includes the point from which the contact is initiated, said GUI being included in the plurality of GUIs combined by the combined GUI, and the generating allows the plurality of GUIs to share the same detection area on the touch panel by switching the combined GUI to said identified GUI.
[009] Optional combinations of the aforementioned constituent elements, and implementations of the invention in the form of methods, apparatuses, systems, computer programs, or the like, may also be practiced as additional modes of the present invention.
[0010] According to the present invention, a wide range of operations can be implemented while maintaining favorable operability, even on a compact-sized device.

BRIEF DESCRIPTION OF THE DRAWINGS
[0011] Embodiments will now be described, by way of example only, with reference to the accompanying drawings, which are intended to be exemplary, not limiting, and in which like elements are given the same reference numerals in the several figures, and in which:
Fig. 1 shows an example structure of a commonly used controller.
Fig. 2 shows an example external view of an information processing device applying an input device according to an embodiment.
Fig. 3 shows the detailed structure of the information processing device according to the embodiment.
Fig. 4 shows an example layout of GUIs in a GUI screen image according to the embodiment.
Fig. 5 is a diagram illustrating an image to be displayed in a first combined GUI area and a detection area on a touch panel according to the embodiment.
Fig. 6 is a diagram illustrating an image to be displayed in a joystick input area and a method of manipulating it according to the embodiment.
Fig. 7 shows a variation of the GUI screen image according to the embodiment.
Fig. 8 shows a variation of an L1/L2 button input area according to the embodiment.
Fig. 9 shows another example layout of the GUI screen image and a content screen image according to the embodiment.
Fig. 10 shows an example configuration screen image of the GUI display mode according to the embodiment.

DETAILED DESCRIPTION OF THE INVENTION
[0012] According to one embodiment, an input device is implemented in a compact-sized information device, such as a mobile phone, a mobile terminal, or the like. The input device has operability similar to that of a game console controller or the like. First, an example of a commonly used controller will be explained. Fig. 1 shows an example structure of a commonly used controller. A controller 120 comprises directional keys 121, joysticks 127a and 127b, four operation buttons 126, L1/L2 buttons 130a, and R1/R2 buttons 130b as operating means to allow manipulation by a user. The four operation buttons 126 comprise a circular button 122, a cross button 123, a square button 124, and a triangular button 125.
[0013] The directional keys 121 are configured to allow a user to enter one of four directions (up, down, left, and right), one of eight directions (the four directions mentioned above and the four directions between them), or an arbitrary direction. For example, the directional keys 121 are used to move a cursor on a screen image of a display device, or to scroll various types of information on a screen image. Different functions are allocated to each of the four operation buttons 126 by an application program.
[0014] The joysticks 127a and 127b each comprise a stick supported so as to be able to tilt in an arbitrary direction, and a sensor that detects the amount of tilt. The stick is biased toward a neutral position by a biasing means (for example, a spring or the like), and returns to the neutral position when not manipulated. The sensor includes a variable resistor, whose resistance changes according to the tilt of the stick, and an A/D converter circuit that converts the resistance value into a digital value. When the stick is tilted, the degrees of tilt along a plurality of reference directions are converted into digital values, respectively, and the values are transmitted to a game device or the like as operation signals.
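By way of illustration only, the A/D conversion described above might be sketched as follows for one reference axis. The resistance range, the 8-bit resolution, and the function name are assumptions made for this sketch; they are not taken from the patent.

```python
def tilt_to_digital(resistance: float, r_min: float = 1000.0,
                    r_max: float = 9000.0, levels: int = 256) -> int:
    """Map a variable-resistor reading for one reference axis to a
    digital tilt value in [0, levels - 1].  Mid-scale corresponds to
    the neutral position; readings outside the calibrated range are
    clamped before conversion."""
    resistance = max(r_min, min(r_max, resistance))
    fraction = (resistance - r_min) / (r_max - r_min)
    return round(fraction * (levels - 1))

# A stick at the neutral position (mid-range resistance) yields a
# mid-scale value; full tilt yields the end of the scale.
print(tilt_to_digital(5000.0))  # -> 128
print(tilt_to_digital(9000.0))  # -> 255
```

With one such conversion per reference direction, the pair of digital values forms the operation signal transmitted to the game device.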
[0015] The L1/L2 buttons 130a and the R1/R2 buttons 130b each consist of two buttons, that is, an L1 button and an L2 button, and an R1 button and an R2 button, respectively. In both combinations, the two buttons are arranged in an upper position and a lower position on the side surface of the controller 120. For example, a button is used to change the line-of-sight direction in a game, or to add a different movement when manipulated simultaneously with another button. However, various functions can also be allocated to these buttons by an application program.
[0016] A user holds a left grip zone 128a with the left hand and a right grip zone 128b with the right hand, and manipulates the controller 120. The directional keys 121, the joysticks 127a and 127b, and the four operation buttons 126 are provided on the top surface of the controller 120 so that the user can manipulate them while holding the left grip zone 128a and the right grip zone 128b with the left and right hands, respectively. The L1/L2 buttons 130a and the R1/R2 buttons 130b are provided on a surface on the far side of the left grip zone 128a and the right grip zone 128b so that the buttons can be manipulated with the index fingers of the left and right hands.
[0017] According to an embodiment, each of the operating means included in the controller shown in fig. 1 is recreated on a flat surface as a GUI. An explanation of an input device according to the embodiment will be given below. Fig. 2 shows an example external view of an information processing device applying the input device according to the embodiment. The information processing device 10 is a compact-sized device that can be carried by a user, and may be any of a mobile phone, a PDA, a portable game device, or the like. Alternatively, the information processing device 10 may be provided with a function that is a combination of the functions of those devices. Accordingly, the information processing device 10 may comprise various processing mechanisms depending on the functions. However, an explanation thereof will be omitted as appropriate, since commonly used technologies can be applied.
[0018] The information processing device 10 is provided with a configuration in which a display 14 is placed on the front face of the main body and an activation key 40 is placed on a side face. Behind the display 14, mechanisms required for various types of information processing, such as a CPU, a graphics processor unit, a sound processor, a memory, or the like, are incorporated (not shown). The display 14 may be any commonly used display, such as a liquid crystal display, an EL (electroluminescent) display, a plasma display, or the like. The top surface of the display 14 is covered with a touch panel 12. The touch panel 12 is implemented by any of the methods in practical use, such as a resistive type, an optical type, or a capacitive type.
[0019] In addition, the information processing device 10 may comprise a speaker for producing sounds, a headphone connection terminal, an infrared port or wireless LAN mechanism for communicating with other devices, a battery box, or the like. However, these elements are not shown in fig. 2.
[0020] On the display 14, screen images necessary to allow a user to enter an operation (for example, a menu screen image, an icon, etc.) and screen images resulting from information processing (for example, a game screen image, a moving image playback screen image, a screen image displaying text, a screen image displaying a photograph, etc.) are displayed according to the function. In addition, a GUI (Graphical User Interface) that allows the user to enter an operation while watching such an image is displayed as an on-screen display.
[0021] The user enters an operation on the information processing device 10 by touching the touch panel 12 with a thumb or another finger of one hand, or by sliding a thumb or another finger of one hand over the touch panel 12, as he/she manipulates the GUI. In fig. 2, a directional key input area 42 that displays a graphic of the directional keys, and a button input area 44 that displays a graphic of the four operation buttons (i.e., the circular button, the cross button, the square button, and the triangular button), are provided as the GUI. In fig. 2, the dashed lines surrounding the directional key input area 42 and the button input area 44 merely illustrate the boundaries of the areas and thus are not related to actual display or functions. The same applies to the following figures.
[0022] For example, when selecting a desired item on a menu screen, a user first moves the highlight among the item names or icons displayed in a list by touching one of the directional keys in the directional key input area 42, and confirms the item by tapping the circular button in the button input area 44. In addition, the user changes the moving direction of a character appearing in a game by touching one of the directional keys in the directional key input area 42. Alternatively, in an interactive game, the user indicates an intention, for example, "yes" by touching the circular button, or "no" by touching the cross button.
[0023] In this way, the input operations implemented by providing the directional key input area 42 and/or the button input area 44 can be varied in many ways by allocating the buttons according to the respective functions implemented by the information processing device 10. According to the embodiment, by recreating the data entry means of a game console or a personal computer in a touch-panel style, the range of input operations on compact-sized information devices can be as diverse as that of a game console or the like.
[0024] In addition, a game that a user has become accustomed to playing on a game console can also be played on a compact-sized information device with similar operability, without causing discomfort to the user. The illustrated shapes and markings of the directional key input area 42 and the button input area 44 are shown as examples only; the shapes and markings are not intended to be limited to those shown in fig. 2. The directional key input area 42 and the button input area 44 can be replaced with other data entry means as appropriate, according to the controller intended to be recreated.
[0025] FIG. 3 shows the detailed structure of the information processing device 10. In addition to the touch panel 12 and the display 14 described above, the information processing device 10 includes a content memory 16, a GUI image memory 18, an input/output control unit 20, an operation information converter unit 22, a content processing unit 24, a GUI image generating unit 26, a GUI image buffer 28, a content image generating unit 30, a content image buffer 32, and an image synthesizer unit 34. The content memory 16 stores content programs and/or various types of data. The GUI image memory 18 stores clip-art graphics data provided for the GUI. The input/output control unit 20 controls the reception of signals input from the touch panel 12 and/or the input/output of image data. The operation information converter unit 22 converts signals input from the touch panel 12 into information on operation details. The content processing unit 24 processes content according to the details of operation. The GUI image generating unit 26 generates an image of a GUI. The GUI image buffer 28 temporarily stores the generated GUI image. The content image generating unit 30 generates an image of the content. The content image buffer 32 temporarily stores the generated content image. The image synthesizer unit 34 generates an image in which the GUI image is displayed as an on-screen display over the content image.
[0026] The elements shown in fig. 3 as functional blocks for executing various processes are implemented in hardware, such as a CPU, a memory, or other LSIs, and in software, such as a program that processes content or performs image processing, etc. Therefore, it will be obvious to those skilled in the art that the functional blocks can be implemented in a variety of ways: by hardware only, by software only, or by a combination thereof.
[0027] The input/output control unit 20 is connected to the touch panel 12, the display 14, the content memory 16, and the GUI image memory 18 by a known method, and controls the input/output of data. An input signal received from the touch panel 12 represents the coordinates of a contact point at which the user touched the touch panel 12, the path of movement of the coordinates when the contact point moves continuously, or the like. Since the method for detecting contact points on the touch panel 12 differs depending on the type of panel, no particular reference is made to it. In addition, the input/output control unit 20 transmits a video signal of a display image to the display 14.
[0028] In addition, the input/output control unit 20 accesses the content memory 16 and reads a program or the various types of data required for processing content. The input/output control unit 20 also accesses the GUI image memory 18 and reads the clip-art graphics data of the directional keys, buttons, or the like described above. The types of "content" are not limited, as long as the content can be processed and represented by a computer, such as a computer game, a movie, a song, a book, a photograph, etc. The embodiment is also applicable to processing general information, such as communication, schedule management, an address book, a spreadsheet, or the like, in addition to such general "content". The "content" in the following explanation includes all of the above.
[0029] The content memory 16 stores, in the case where the content is a game, a program thereof, information on a player, a level reached when the game was last played, or the like. In the case where the content is a movie or a song, the content memory 16 stores compressed and encoded video data, audio data, a program for decoding and reproducing the data, or the like. The content memory 16 may be a hard disk drive, or may be a combination of a removable recording medium (for example, a memory card, a ROM disk, an optical disk, a magneto-optical disk, or the like) and a reader thereof.
[0030] The GUI image memory 18 is a memory (for example, a hard disk drive or the like) that stores image data usable as clip-art graphics of the GUI image, such as the directional keys, the variety of button types, or the like. As will be described later, the image of a GUI itself can change as a result of an operation on the GUI, according to the embodiment. Therefore, the GUI image memory 18 stores image data corresponding to these GUI variations.
[0031] The operation information converter unit 22 acquires a signal input through the touch panel 12 from the input/output control unit 20, and converts the coordinates of the contact point or the like included in the signal into information on details of operation. As described above, according to the embodiment, a GUI itself can change as a result of an operation on the GUI. Therefore, the correspondence between the type of each GUI and the position where the GUI is displayed is stored internally in advance. Then, based on the acquired coordinates of the contact point and the type of the GUI currently being displayed, the details of the operation intended by the user (for example, the type of button pressed, the degree and/or direction of the operation, or the like) are identified.
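The correspondence lookup performed by the operation information converter unit 22 can be pictured as a hit test of the contact point against the GUIs currently displayed. In the sketch below, the area names, the rectangles, and the returned dictionary format are hypothetical, chosen only to illustrate the step of mapping coordinates to a GUI and a position within it.

```python
# Hypothetical layout of the GUIs currently on screen: each GUI is
# registered with its display rectangle as (x0, y0, x1, y1).
GUI_LAYOUT = {
    "directional_keys": (0, 400, 160, 560),
    "operation_buttons": (480, 400, 640, 560),
}

def convert_contact(x: int, y: int, layout: dict) -> dict:
    """Translate a contact point reported by the touch panel into
    operation details, based on which displayed GUI contains it.
    The position relative to the GUI's origin is returned so that a
    later step can decide, e.g., which directional key was pressed."""
    for gui_name, (x0, y0, x1, y1) in layout.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return {"gui": gui_name, "rel": (x - x0, y - y0)}
    return {"gui": None, "rel": None}  # contact outside any GUI

print(convert_contact(80, 480, GUI_LAYOUT))  # inside the directional keys
```

Because the layout table is consulted at conversion time, replacing an entry when a GUI is switched is enough to reinterpret later contacts in the same detection area.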
[0032] The information on the details of operation is reported to the content processing unit 24 and the GUI image generating unit 26, respectively. The content processing unit 24 performs, according to a program stored in the content memory 16, a process required to advance a game, or to reproduce a moving image and/or music, according to the details of operation. The actual processing procedure is similar to one commonly performed according to the details of the content.
[0033] The GUI image generating unit 26 generates a new image of a GUI as the need arises based on the details of operation, and stores the image in the GUI image buffer 28. Although specific examples of changing a GUI image will be given later, for example, colors are changed, or a button is shown as if it were pressed, in order to indicate that the user has made contact with the touch panel, or a key and/or button itself is replaced with that of another GUI.
[0034] Therefore, the GUI image generating unit 26 stores information that associates the details of operation with a change to be made in an image, information identifying an image to be newly used, or the like. The GUI image generating unit 26 then reads the image data of a required GUI from the GUI image memory 18 as appropriate, and generates data of a new screen image so that a change associated with an operation made to the GUI currently being displayed is represented.
[0035] Depending on the details of the operation, in case the GUI does not need to be changed, the GUI image generating unit 26 does not need to generate a new image. The content image generating unit 30 generates data of an image to be produced as a result of processing performed by the content processing unit 24, and stores the data in the content image buffer 32 accordingly.
[0036] By performing rendering processing using the image data stored in the GUI image buffer 28 and the image data stored in the content image buffer 32, the image synthesizer unit 34 generates an image in which the GUI image is displayed as an on-screen display over the content image, and stores the image in an internal frame buffer accordingly. By allowing a video signal corresponding to the image stored in the frame buffer to be transmitted to the display 14 under the control of the input/output control unit 20, an image corresponding to a GUI operation performed by the user is displayed on the display 14.
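The on-screen-display synthesis can be pictured, for a single pixel, as a conventional alpha blend of the GUI buffer over the content buffer. The patent does not specify the blending rule; the formula below is one common choice, shown for illustration only.

```python
def composite(content_px, gui_px, alpha):
    """Blend one GUI pixel over one content pixel (channel values 0-255).
    alpha = 0.0 leaves the content untouched; alpha = 1.0 shows only
    the GUI.  Intermediate values keep the content visible under a
    semi-transparent GUI overlay."""
    return tuple(round(g * alpha + c * (1.0 - alpha))
                 for g, c in zip(gui_px, content_px))

# A half-transparent white GUI pixel over a black content pixel.
print(composite((0, 0, 0), (255, 255, 255), 0.5))  # -> (128, 128, 128)
```

Applied over the whole frame, such a blend is what lets the GUI coexist with the content image without fully obscuring it.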
[0037] A specific example of the GUI screen image according to the embodiment will now be presented. According to the embodiment, a GUI screen image is displayed as an on-screen display over a content screen image, such as that of a game or the like. Therefore, it is important to be able to attain operability similar to that of a game console controller or the like without impairing the content screen image.
[0038] According to the embodiment, a joystick is recreated as a GUI in addition to the directional keys and the four operation buttons. In this process, by configuring the recreated joystick to allow a user to enter an arbitrary direction and an arbitrary amount in a manner similar to that of an original joystick, the user can use either the joystick or the directional keys according to the circumstances when indicating a direction, in a manner similar to that of an original controller. This GUI makes it possible to input an arbitrary direction and an arbitrary amount, and can be used to control the moving direction and/or speed of a character that appears in a game, to rotate a field of view, etc. In the following explanation, an input made using the pseudo-joystick represented as a GUI is also referred to as a "joystick input".
[0039] If the three types of GUI described above, or more than three GUIs, are arranged on a GUI screen image as they are, the GUIs may impair the visibility of a content screen image. Therefore, according to the embodiment, by providing a combined GUI area that is a combination of a plurality of GUIs, the plurality of GUIs share the same detection area. Depending on the position of the contact point in the combined GUI area when a user newly touches the area, it is determined as which of the GUIs the entire combined GUI area is to be used.
[0040] FIG. 4 shows an example layout of the GUIs in a GUI screen image. Each of the GUI screen images 50a, 50b, 50c, 50d, and 50e is a screen image displayed as an on-screen display over a content image on the display 14. An operation performed by a user switches the screen image to be displayed between the GUI screen image 50a and one of the GUI screen images 50b, 50c, 50d, and 50e. The GUI screen image 50a is a standard screen image, and displays a first combined GUI area 52 and a second combined GUI area 56 at the bottom left and the bottom right of the screen image, respectively.
[0041] The first combined GUI area 52 is an area for a GUI that is a combination of a GUI of the directional keys and a GUI of a joystick, and has the same design as that of the directional key input area 42 shown in fig. 2. The first combined GUI area 52 is configured with a directional key graphic 51, which is at least a part of the graphic of the directional key GUI, and a joystick graphic 53, which is represented by a figure (for example, a circle or the like) at the center of the directional key graphic 51 and which is at least a part of the graphic of the joystick GUI.
[0042] The second combined GUI area 56 is an area for a GUI that is a combination of a GUI of the four operation buttons and a GUI of a joystick, and has the same design as that of the button input area 44 shown in fig. 2, in a manner similar to that of the first combined GUI area 52. The second combined GUI area 56 is configured with an operation button graphic 55, which is at least a part of the graphic of the GUI of the four operation buttons, and a joystick graphic 57, which is represented by a figure (for example, a circle or the like) at the center of the operation button graphic 55, and which is at least a part of the graphic of the joystick GUI.
[0043] On the GUI screen image 50a, which is a standard screen image, if a user newly touches the joystick graphic 53 of the first combined GUI area 52, a process of receiving an input via the joystick is initiated, and the first combined GUI area 52 is switched to a joystick input area 58 that does not include the directional key graphic 51 (GUI screen image 50b). More specifically, the process is performed by a continuous movement; that is, when the user places a thumb or another finger on the joystick graphic 53 of the first combined GUI area 52, the area is switched to the joystick input area 58, and by sliding the thumb or other finger over the touch panel without removing it, the moving direction and moving distance of the thumb or other finger are acquired as input values.
[0044] During the period in which the thumb or other finger remains in continuous contact, the area functions as the joystick input area 58 and sequentially acquires input values from the movement of the thumb or other finger. When the user removes the thumb or other finger, the area switches back to the first combined GUI area 52 (GUI screen image 50a).
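The acquisition of an input value from the sliding movement might be sketched as follows: treating the point where contact began as the neutral position, the direction and distance of the finger relative to that point are returned as the joystick input. The function and parameter names are illustrative, not from the patent.

```python
import math

def joystick_input(start, current):
    """Derive a joystick input value from a sliding gesture.

    start   -- (x, y) point where contact with the touch panel began
    current -- (x, y) current finger position
    Returns (direction in radians, distance), i.e. an arbitrary
    direction and an arbitrary amount, as a real joystick would."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)

angle, distance = joystick_input((100, 100), (130, 140))
# distance is 50 (a 3-4-5 triangle scaled by 10); the angle points
# toward the lower right in screen coordinates.
```

Calling this on every reported movement while the finger stays in contact yields the sequential stream of input values described above.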
[0045] On the other hand, on the GUI screen image 50a, if a user newly touches the directional key graphic 51 of the first combined GUI area 52, a process of receiving an input via the directional keys is initiated, and the first combined GUI area 52 is switched to the directional key input area 42 (GUI screen image 50c). Likewise, in this case, during the period in which the thumb or other finger remains in continuous contact, the area functions as the directional key input area 42 and receives input via the directional keys, and when the user removes the thumb or other finger, the area switches back to the first combined GUI area 52 (GUI screen image 50a).
[0046] However, the joystick graphic 53 may be kept displayed in the directional key input area 42 as well, so that the first combined GUI area 52 and the directional key input area 42 appear similar, as shown in fig. 4. This eliminates the inconvenience of the joystick graphic 53 disappearing and reappearing even when the directional keys are touched discontinuously for inputs via the directional keys.
[0047] The second combined GUI area 56 also works in a similar way; that is, if a user newly touches the joystick graphic 57, a process of receiving an input via the joystick is initiated and the area is switched to the joystick input area 58, which does not include the operation button graphic 55 (GUI screen image 50d), and if the thumb or other finger is removed, the area is switched back to the second combined GUI area 56 (GUI screen image 50a). While the thumb or other finger remains in contact, the area functions as the joystick input area 58 and tracks the movements of the thumb or other finger.
[0048] On the other hand, if the user instead touches the four-button operation button graphic 55 of the second combined GUI area 56, a process of receiving input via the buttons is initiated, and the second combined GUI area 56 is switched to the button input area 44 (GUI screen image 50e). If the user then removes the thumb or finger, the area switches back to the second combined GUI area 56 (GUI screen image 50a). Also in this case, the joystick graphic 57 may be kept displayed in the button input area 44, for the same reason as for the directional key input area 42.
[0049] Enabling switching between the directional key input area 42 and the joystick input area 58, and between the button input area 44 and the joystick input area 58, in this way reduces the size of the area occupied by a GUI image. As a result, even when displayed as an on-screen display, the GUI obstructs the content screen image only slightly, and the content screen image and the GUI can coexist in a limited space. Furthermore, since a real joystick is operated by tilting the stick in a desired direction with its center as the starting point, associating a sliding motion of the thumb or finger from the center of an area with activation of joystick input makes the action required for switching feel natural.
[0050] Furthermore, by configuring both the first combined GUI area 52 on the lower left side and the second combined GUI area 56 on the lower right side to be switchable to the joystick input area 58, the combination of input means displayed concurrently can be changed, or the hand used for operation (i.e., the left hand or the right hand) can be chosen, in a flexible and quick manner, depending on the type of content, the details or a scene of a game, and so on.
[0051] In the example above, the GUIs of the directional key input area 42 and the button input area 44 need only be implemented as GUIs for on/off input, which switch the function assigned to each button on or off through contact or non-contact with a separate region representing a commonly used button. The GUIs are therefore not limited to direction instruction keys, a circle button, a cross button, a square button, and a triangle button. Likewise, the GUI of the joystick input area 58 need only be implemented as a GUI for analog input, which receives an analog value according to a contact position in a given area, and is therefore not limited to the joystick function.
[0052] In both cases, a GUI for on/off input and a GUI for analog input are combined into a combined GUI. In the combined GUI, the graphic of the GUI for on/off input is displayed unchanged, and an instruction for switching to the GUI for on/off input is received through it. Concomitantly, the detection area for switching to the GUI for analog input is defined as a small area, and this detection area is expanded after switching. This enables the operation of switching the combined GUI to the respective GUIs, and the operations on the respective GUIs, to be performed naturally in a single flow of movements. In this process, when switching to the GUI for on/off input, the graphic for switching to the GUI for analog input, which was displayed in a part of the combined GUI, is kept displayed. This eliminates the annoyance of that graphic disappearing and reappearing while the GUI for on/off input is operated intermittently.
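The switching behavior described in this paragraph can be sketched as a small state machine. The following is an illustrative reconstruction, not code from the patent; all class and state names, and the idea of representing the modes as three string constants, are assumptions made for the sketch.

```python
# Illustrative sketch (not from the patent) of the combined-GUI switching
# logic: a small central detection area selects the analog (joystick) GUI,
# a touch elsewhere in the combined area selects the on/off GUI, and
# releasing the finger returns to the combined GUI. The detection area for
# the analog GUI is expanded after switching, as paragraph [0052] describes.
import math

COMBINED, ONOFF, ANALOG = "combined", "onoff", "analog"

class CombinedGui:
    def __init__(self, center, small_radius, expanded_radius):
        self.center = center                    # center of circle 60
        self.small_radius = small_radius        # detection area 68 (before switch)
        self.expanded_radius = expanded_radius  # detection area 79 (after switch)
        self.state = COMBINED

    def _dist(self, point):
        return math.hypot(point[0] - self.center[0], point[1] - self.center[1])

    def touch_down(self, point):
        # A touch near the center starts analog input; elsewhere inside the
        # combined area it starts on/off (key or button) input.
        if self._dist(point) <= self.small_radius:
            self.state = ANALOG
        else:
            self.state = ONOFF

    def touch_move(self, point):
        # While in analog mode, leaving the expanded detection area
        # switches straight back to the combined GUI.
        if self.state == ANALOG and self._dist(point) > self.expanded_radius:
            self.state = COMBINED

    def touch_up(self):
        # Removing the finger always restores the combined GUI.
        self.state = COMBINED
```

Note how the same touch that starts analog input can then roam over the whole expanded area without leaving analog mode, which is the "single flow of movements" the paragraph refers to.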
[0053] In addition, a toggle button 54 may be displayed in the GUI screen image 50a. Touching this toggle button 54 can make a button that has not been displayed (for example, a button required for a particular function, such as a select button for selecting a menu item, a menu button, or a start button for starting playback of a moving image, music, or the like) appear from the bottom of the screen image. Configured this way, buttons that are used less often can be hidden, and the content screen image can be kept easily visible.
[0054] FIG. 5 is a diagram illustrating an image to be displayed in the first combined GUI area 52 and a detection area on the touch panel. The left part of fig. 5 is the image of the first combined GUI area, and the right part is an image in which the detection areas are superimposed on it. The first combined GUI area 52 is configured with a circle 60, a directional key graphic 62 comprising four keys indicating four directions (up, down, left, and right) arranged on the circumference of circle 60, and a joystick graphic 53 arranged at the center of circle 60.
[0055] The circle 60 displayed in the first combined GUI area 52 is also displayed in the directional key input area 42, and represents the sense of unity of the two-dimensional set of four keys, which an original controller conveys through its three-dimensional shape. By displaying this kind of circle, even when the content image displayed as the background is cluttered, the set of keys can easily be perceived as a GUI.
[0056] Circle 60 can be displayed in a partially translucent manner so that it does not block the content screen image. The transmittance is configurable, so that the user can set it taking into account the type of content, or the like. By also displaying a similar circle in the second combined GUI area 56 and in the button input area 44, a sense of unity as a set is conveyed. The color scheme of the four directional keys or operation buttons is preferably based on monotone, so as to give priority to the colors of the content screen image.
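The partially translucent display mentioned above amounts to a per-pixel alpha blend of the circle over the content image. The following sketch is an assumption about how such a blend could be computed (the patent does not specify a formula); the function name and the convention that a transmittance of 1.0 leaves the content unchanged are illustrative.

```python
# Illustrative sketch (assumption, not from the patent) of displaying
# circle 60 in a partially translucent manner: a per-pixel blend where
# transmittance 1.0 shows only the content and 0.0 shows only the circle.
def blend_pixel(content_rgb, circle_rgb, transmittance):
    """Blend the circle graphic over the content; transmittance in [0, 1]."""
    return tuple(
        round(c * transmittance + g * (1.0 - transmittance))
        for c, g in zip(content_rgb, circle_rgb)
    )
```

Applying this to every pixel covered by the circle graphic, with the transmittance taken from the user setting described in paragraph [0076], would give the configurable translucency the text describes.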
[0057] The four keys of the directional key graphic 62 are associated respectively with rectangular detection areas 64a, 64b, 64c, and 64d of predetermined size that surround the graphic symbols of the respective keys on the touch panel. In addition, the joystick graphic 53 is associated with a detection area 68 positioned at the center of circle 60. By defining the shape of the detection areas 64a, 64b, 64c, and 64d as one that can be described by a mathematical expression (for example, a triangle, a circle, or the like, in addition to a rectangle), the areas can readily be associated with the displayed keys, regardless of the resolution of the display and/or the touch panel.
[0058] Furthermore, by making the detection areas 64a, 64b, 64c, and 64d large enough to include the areas surrounding the graphic symbols of the respective keys, an operation can be detected even if the point of contact of the thumb or finger deviates slightly from the key, and, concomitantly, overlapping areas 66a, 66b, 66c, and 66d can be provided between adjacent detection areas. These overlapping areas 66a, 66b, 66c, and 66d are associated with the four diagonal directions, which are intermediate between the four directions up, down, left, and right. This doubles the number of directions that can be entered compared with the case where only a part of each key is defined as a detection area, thus allowing the user to instruct a direction at finer granularity.
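The overlap trick in this paragraph can be shown concretely: a contact point falling inside two adjacent rectangles is reported as the diagonal between them. This is an illustrative reconstruction, not the patent's code; the function names, the rectangle coordinates, and the use of screen coordinates with y increasing downward are all assumptions.

```python
# Illustrative sketch (names and geometry are assumptions, not from the
# patent) of paragraph [0058]: four enlarged rectangular detection areas
# overlap at the corners, and a contact point falling in an overlap is
# reported as the intermediate (diagonal) direction.
def hit_rect(point, rect):
    """rect = (left, top, right, bottom); screen coords, y grows downward."""
    x, y = point
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def directional_input(point, rects):
    """rects maps 'up'/'down'/'left'/'right' to detection rectangles.
    Returns e.g. 'up', 'up-left', or None when no key is hit."""
    hits = [name for name in ("up", "down", "left", "right")
            if hit_rect(point, rects[name])]
    if not hits:
        return None
    if len(hits) == 1:
        return hits[0]
    # Overlap of a vertical and a horizontal key -> diagonal direction.
    vertical = next(h for h in hits if h in ("up", "down"))
    horizontal = next(h for h in hits if h in ("left", "right"))
    return f"{vertical}-{horizontal}"
```

With rectangles sized to overlap at the corners, the same four key graphics yield eight distinguishable directions, exactly as the paragraph argues.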
[0059] In the case of the directional key input area 42, the detection areas corresponding to the directional keys are defined in a manner similar to that shown in fig. 5. However, the area used to detect whether the user remains in continuous contact, which is the criterion for switching from the directional key input area 42 back to the first combined GUI area 52, is determined separately, for example as a predetermined concentric circle having a radius equal to or greater than that of circle 60. This prevents the area from switching back to the first combined GUI area 52, or on to the joystick input area 58, against the user's intent when the thumb or finger entering information via the directional keys passes through the central portion of the directional key input area 42. When the contact point leaves this detection area, the directional key input area 42 is switched back to the first combined GUI area 52.
[0060] FIG. 6 is a diagram illustrating an image to be displayed in the joystick input area 58 and a method of operating it. As described above, the joystick input area 58 is displayed for the period of time during which the user slides the thumb or finger on the touch panel without removing it, after touching the detection area 68 of the joystick graphic 53 of the first combined GUI area 52, or the like. In this process, as shown in the left part of fig. 6, an indicator 70 (e.g., a circle or the like) is displayed at the position touched by the user's thumb or finger 76. The indicator 70 preferably has a size and shape such that it is not hidden by the thumb or finger. The indicator 70 can further be made easily perceptible by applying a glowing effect to the area surrounding it, by making it follow the movement of the contact position with a slight lag, or by showing its trail faintly.
[0061] The indicator 70 moves according to the movement of the point in contact with the thumb or finger 76. In the joystick input area 58, a circle 72 is also displayed, with the same radius and at the same position as the circles shown in the first combined GUI area 52, the second combined GUI area 56, the directional key input area 42, and the button input area 44. Since the joystick graphics are provided in the central portions of the first combined GUI area 52 and the second combined GUI area 56, the center of the circle is located at the position that the user first touches for a joystick input. Furthermore, as shown on the right side of fig. 6, a detection area 79 is defined as a concentric circle with a radius equal to or greater than that of circle 72. If the contact point leaves the detection area 79, the area is switched back to the original first combined GUI area 52 or second combined GUI area 56.
[0062] As is apparent from fig. 5 and fig. 6, the detection area 79 that receives a joystick input in the joystick input area 58 is a concentric expansion of the detection area 68 used to switch to the joystick input area 58 in the first combined GUI area 52. The same applies to the relationship between the detection area for switching to joystick input in the second combined GUI area 56 and the detection area after switching.
[0063] Circle 72 may be displayed in a partially translucent manner, or may not be displayed at all. In the joystick input area 58, the coordinates of the contact point 78 are acquired continuously at predetermined time intervals, and a direction vector from the center 74 of circle 72 in the detection area 79 to the contact point 78 is taken as the input value for each time point. The time interval for acquiring the coordinates is set, for example, shorter than the time interval for displaying frames on the display 14.
[0064] The operating information converter unit 22 stores information that associates the direction of the direction vector from the center 74 to the contact point 78 with a tilt direction of an original joystick, and the distance from the center 74 to the contact point 78 with a tilt amount of the original joystick. Referring to this information, the operating information converter unit 22 converts the direction vector at each time point into a tilt direction and a tilt amount of the original joystick, and notifies the content processing unit 24 of them. This makes it possible for the content processing unit 24 to perform processing in a manner similar to the case where input is made through an original joystick. Alternatively, the direction or distance may be used directly to process content.
[0065] If the contact point reaches the circumference of circle 72, the input value to be obtained is defined as the input value of a fully tilted joystick. If the contact point departs from circle 72, then as long as it remains within the range of the detection area 79, only the direction from the center 74 is acquired according to the contact point, and the distance is treated as saturated at the radius of circle 72, regardless of the contact point. In this process, the indicator 70 is displayed so as to move, according to the direction of the contact point, along the circumference of circle 72. Providing the joystick input area 58 in this way implements, on a flat surface, an input means that differs little from an original joystick in method of operation and operability. In addition, by disabling the display of circle 72, or displaying circle 72 in a partially translucent manner, the effect on the content screen image can be minimized while an arbitrary direction and an arbitrary amount can still be entered.
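The sampling, conversion, and saturation described in paragraphs [0063] to [0065] can be combined into one small function. This is an illustrative sketch, not the patent's implementation; the function name, the (angle, tilt) return convention, and the use of `None` to signal the switch back to the combined GUI are assumptions.

```python
# Illustrative sketch (not from the patent; names are assumptions) of
# paragraphs [0063]-[0065]: the contact point is sampled, a direction
# vector from the circle center 74 is taken as the input value, and the
# distance saturates at the radius of circle 72 even while the contact
# point stays inside the larger detection area 79.
import math

def joystick_value(center, contact, circle_radius, detection_radius):
    """Return (angle_radians, tilt) with tilt in [0, 1], or None when the
    contact point leaves the detection area (switch back to combined GUI)."""
    dx = contact[0] - center[0]
    dy = contact[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist > detection_radius:
        return None                 # left detection area 79: leave joystick mode
    if dist == 0.0:
        return (0.0, 0.0)           # at the center: no tilt
    angle = math.atan2(dy, dx)      # corresponds to the stick's tilt direction
    tilt = min(dist / circle_radius, 1.0)  # saturates at full tilt
    return (angle, tilt)
```

A content processing unit could then treat the (angle, tilt) pair like the tilt direction and tilt amount of a physical stick, sampling it at intervals shorter than the frame interval as the text suggests.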
[0066] FIG. 7 shows a variation of the GUI screen image. A GUI screen image 50f of the information processing device 10 in fig. 7 includes a first combined GUI area 52 at the bottom left of the screen image and a second combined GUI area 56 at the bottom right, in a manner similar to the GUI screen image 50a shown in fig. 4. The GUI screen image 50f also includes an L1/L2 button input area 80 at the top left of the screen and an R1/R2 button input area 82 at the top right of the screen. The L1/L2 button input area 80 is configured to include buttons corresponding to the L1 button and the L2 button shown in fig. 1. The R1/R2 button input area 82 is configured to include buttons corresponding to the R1 button and the R2 button shown in fig. 1.
[0067] In this embodiment, it is assumed that the user operates the GUI with the thumbs and fingers of the left and right hands 84 while holding the main body of the information processing device 10 with the left and right hands 84, as shown in fig. 7. Therefore, arranging the first combined GUI area 52 and the second combined GUI area 56 at the bottom left and bottom right of the screen image allows the thumbs to manipulate the areas.
[0068] In addition, arranging the L1/L2 button input area 80 and the R1/R2 button input area 82 at the top left and top right of the screen image, as shown in fig. 7, allows them to be operated with the index fingers. Although fig. 7 shows the information processing device 10 and the hands 84 separately so that the GUI screen image 50f is easily viewed, when the device is actually held, the thumbs and index fingers are positioned over the GUI screen image 50f.
[0069] By arranging the GUIs in this way, the user can operate the information processing device 10 without strain while holding it, and, in addition, can operate two or more of the four areas concurrently. The L1/L2 button input area 80 and the R1/R2 button input area 82 are formed as circle sectors whose central angles are the right angles of the two upper corners of the screen, and internally dividing these central angles so as to split each sector into two sectors enables discrimination between the L1 button and the L2 button, and between the R1 button and the R2 button, as shown in fig. 7.
[0070] When the information processing device 10 is held as shown in fig. 7, the bases of the index fingers of the hands 84 are typically placed opposite the middle fingers so as to grip the housing of the information processing device 10. Therefore, the ranges in which the index fingers can operate the touch panel without strain take the shape of circle sectors, traced by bending the parts above the joints of the index fingers. By defining the shapes of the L1/L2 button input area 80 and the R1/R2 button input area 82 as circle sectors, the user can touch the L1 button or the L2 button, and the R1 button or the R2 button, in a distinguishable manner simply by changing the bending angle of the finger. More than two buttons can be provided by changing the number of partitions of the sector-shaped areas.
[0071] The angles into which the original central angle is internally divided need not be equal. For example, taking into account the characteristic that the smaller the bending angle of a finger, the more easily the finger can be controlled, the angle of a button sector near the top of the screen can be set small, and the angle of a button sector closer to the left or right edge of the screen can be set larger. Instead of the shape of the buttons, the button layout can be adapted to the range of finger movement.
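The sector-shaped detection described in paragraphs [0069] to [0071] reduces to an angle test around the screen corner. The sketch below is an illustrative reconstruction, not the patent's code; the coordinate convention (origin at the top-left corner, y downward), the function name, and which of L1/L2 sits nearer the top edge are assumptions, and the split angle can be set unequal as the paragraph allows.

```python
# Illustrative sketch (geometry and names are assumptions) of paragraphs
# [0069]-[0071]: the L1/L2 area is a quarter circle anchored at the
# top-left screen corner, internally divided by angle. The division need
# not be equal: the sector nearer the top of the screen (small bend of the
# index finger) can be made narrower than the one nearer the screen edge.
import math

def l_button_hit(point, radius, split_angle_deg):
    """point is (x, y) from the top-left corner, x rightward, y downward.
    Angles run from 0 (along the top edge) to 90 (down the left edge).
    Returns 'L1', 'L2', or None when the point is outside the sector."""
    x, y = point
    if x < 0 or y < 0 or math.hypot(x, y) > radius:
        return None
    angle = math.degrees(math.atan2(y, x))   # 0..90 inside the quadrant
    return "L1" if angle < split_angle_deg else "L2"
```

With, say, a 40-degree split, the top sector (shallow finger bend) is narrower than the side sector (deeper bend), matching the unequal division the paragraph suggests; the mirror-image test for R1/R2 would be anchored at the top-right corner.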
[0072] FIG. 8 shows a variation of the L1/L2 button input area. Although three buttons, i.e., an L1 button 88a, an L2 button 88b, and an L3 button 88c, are provided in the L1/L2/L3 button input area 86 shown in fig. 8 as an example, the number of buttons is not limited. The shape of each button is defined independently of the others, and the buttons are arranged so as to form an arc. Although in fig. 8 the shape of each button is a circle, the shape can be a rectangle or the like. In this way, even if the buttons themselves look like ordinary buttons, arranging them in an arc lets the user touch them without strain in a distinguishable manner, depending on the bending angle of the index finger. Also in this example, the farther a button is from the top of the screen and the closer it is to the left edge, the greater the spacing between it and the adjacent button can be set. The same applies to buttons at the opposite position, such as the R1/R2 buttons shown in fig. 7, or the like.
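The arc arrangement in this paragraph can be sketched by placing button centers on an arc around the corner and hit-testing by distance. This is an illustrative reconstruction, not the patent's code; the function names, the corner-origin coordinates, and the particular radii and angles are assumptions.

```python
# Illustrative sketch (all values are assumptions) of paragraph [0072]:
# independently shaped circular buttons placed along an arc around the
# top-left corner, with angular spacing allowed to grow toward the edge.
import math

def arc_button_centers(arc_radius, angles_deg):
    """Place button centers on an arc around the top-left corner (origin);
    0 degrees lies along the top edge, 90 down the left edge."""
    return [(arc_radius * math.cos(math.radians(a)),
             arc_radius * math.sin(math.radians(a))) for a in angles_deg]

def arc_button_hit(point, centers, button_radius):
    """Return the index of the touched button, or None."""
    for i, (cx, cy) in enumerate(centers):
        if math.hypot(point[0] - cx, point[1] - cy) <= button_radius:
            return i
    return None
```

Passing unevenly spaced angles such as `[15, 45, 80]` reproduces the widening gaps toward the screen edge that the paragraph describes; a mirrored arrangement would serve the R-side buttons.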
[0073] FIG. 9 shows another example layout of the GUI screen image and a content screen image. In the embodiment described above, the GUI screen image is displayed as an on-screen display over the content screen image. In the example shown in fig. 9, the information processing device 10 is used while oriented vertically long, and a content screen image 100 and a GUI screen image 102 are displayed in separate areas. Also in this case, the information processing device 10 can be configured in a manner similar to that shown in fig. 3, and the image synthesizer unit 34 performs synthesis by rendering the content image and the GUI image in separate predetermined areas, respectively. The methods of operating the various types of GUIs are also implemented in a manner similar to that described above.
[0074] In this case, although the area for displaying the content screen image 100 becomes smaller, the GUI screen image does not hide the content screen image 100, so the directional key input area, the button input area, the joystick input area, and the like can be displayed concurrently without major inconvenience. In addition, in this case there is a high probability that the shape of the hand holding the device and/or of the thumb or finger operating the input areas differs from the case described above. Therefore, the L1 button, the L2 button, the R1 button, and the R2 button need not be formed as circle sectors, arranged in an arc, or the like.
[0075] The information processing device 10 is configured so that the user can choose whether to use the device 10 oriented vertically long or horizontally long, taking into account the type of content, the type of a particular game, or the like. Fig. 10 shows an example configuration screen image for the GUI display. A GUI configuration screen image 90 is provided so that, either before starting content processing or in the middle of it, the user can call up the GUI configuration screen image 90 at any time by touching a predetermined button of a GUI, and the GUI configuration screen image 90 is displayed as an on-screen display. Alternatively, a menu screen can be called up first, and the GUI configuration screen image 90 can then be called up by selection from the menu.
[0076] The GUI configuration screen image 90 includes a configuration detail display field 92, a transmittance setting bar 94, and a confirm/cancel button 96. The configuration detail display field 92 displays the details of the configuration, such as: a) the orientation of the information processing device 10; b) the mode determining whether the joystick input area 58 is switched with the directional key input area 42 and/or the button input area 44, or whether the joystick input area 58 is displayed at all times; c) whether or not the circle displayed in the directional key input area 42 and/or the button input area 44 is displayed in a partially translucent manner, and the color of the circle; and the like. The transmittance setting bar 94 sets the transmittance of the circle when the circle is displayed in a partially translucent manner. The confirm/cancel button 96 allows the user to confirm the configuration details or cancel a configuration change.
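The settings listed in this paragraph, and the cycling-through-candidates interaction described for the configuration screen, can be sketched as a small immutable configuration record. This is an illustrative reconstruction, not the patent's implementation; the field names, the candidate lists, and the choice of a frozen dataclass (so that cancel simply keeps the old record) are assumptions.

```python
# Illustrative sketch (not from the patent) of the settings held by the
# configuration screen 90: touching the direction instruction button for an
# item cycles it to the next candidate, and cancel reverts by discarding
# the pending record and keeping the old one.
from dataclasses import dataclass, replace

CANDIDATES = {
    "orientation": ("horizontal", "vertical"),
    "joystick_mode": ("switched", "always_shown"),
    "circle_display": ("translucent", "opaque", "hidden"),
}

@dataclass(frozen=True)
class GuiConfig:
    orientation: str = "horizontal"
    joystick_mode: str = "switched"
    circle_display: str = "translucent"
    transmittance: float = 0.5   # slider 94; 0.0 = opaque, 1.0 = invisible

def cycle(config, item):
    """Advance one item to its next candidate (direction button 98)."""
    options = CANDIDATES[item]
    nxt = options[(options.index(getattr(config, item)) + 1) % len(options)]
    return replace(config, **{item: nxt})
```

Because each cycle returns a new record, the confirm button would commit the latest record while the cancel button simply keeps the original, matching the confirm/cancel behavior the paragraph describes.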
[0077] In the example of the configuration detail display field 92 shown in fig. 10, a configuration is defined in which the information processing device 10 is used while oriented horizontally long, the mode in which the joystick input area 58 is switched with the directional key input area 42 and/or the button input area 44 is adopted, and the circles included in the GUIs are set to be displayed in a partially translucent manner.
[0078] By calling up the GUI configuration screen image 90 when the user wants to change a configuration and touching the direction instruction button 98 displayed in the configuration detail display field 92 for each item, the currently displayed setting is switched to the next candidate. When the desired setting is displayed, the user confirms it by touching the confirm button included in the confirm/cancel button 96. When the circles are displayed in a partially translucent manner, the user also adjusts the transmittance setting bar 94 as appropriate.
[0079] By allowing the user to easily change the configuration of the display mode of a GUI in this way, the information processing device can be operated in an optimal environment according to the type of content, the details or a scene of a game, or the like, and according to the preference of the individual user. Therefore, even when the device is oriented horizontally long, the directional key input area 42, the button input area 44, and the joystick input area 58 can easily be displayed concurrently, or the shapes of the L1/L2 button input area 80 and the R1/R2 button input area 82 can easily be set as rectangles.
[0080] According to the embodiment described above, a GUI screen image is displayed as an on-screen display on the display screen of an information processing device. The GUI displayed in this way is a two-dimensional representation of the directional keys, the various types of buttons, the joystick, and the like that are formed three-dimensionally on a conventional controller of a game device. Thus, even a user accustomed to a three-dimensional controller can easily operate the device with similar operating means and similar operability.
[0081] The image of each GUI is arranged near a corner of the screen, so that the region it hides in the content output image is small. In addition, by providing a combined GUI area and enabling switching to the individual GUIs depending on which GUI graphic includes the point at which the user initiates contact, the same detection area is shared by a plurality of GUIs. In addition, a GUI is provided that can readily call up a GUI that is by default not displayed. Thanks to these characteristics, a content screen image and a GUI screen image can coexist naturally.
[0082] As a way of providing a combined GUI area, a joystick graphic is provided in the central portion of a graphic configured with directional keys or four buttons, and by sliding a thumb or finger from a point in that central portion where contact is initiated, input of an arbitrary direction and an arbitrary amount can be implemented. This continuous movement resembles placing a thumb or finger on an original joystick input device and tilting the stick, thus causing the user little discomfort when switching GUIs or during operation.
[0083] In addition, placing GUIs at the bottom left, bottom right, top left, and top right of the screen image, in correspondence with the positions of the thumbs and fingers holding the information processing device, makes natural operation possible using the thumbs and/or index fingers of both hands. Furthermore, by forming the plurality of buttons arranged at the top left and top right, which are likely to be operated by the index fingers, as a continuous sector-shaped area, or by arranging the plurality of buttons in an arc that takes the range of movement of the index fingers into account, the user can touch the buttons in a distinguishable manner without strain.
[0084] By allowing the user to configure the orientation of the information processing device, whether or not to switch GUIs, and whether or not a part of an image constituting a GUI is displayed in a partially translucent manner, an operating environment that suits the type of content, the user's preference, or the like can readily be realized.
[0085] An explanation was given above based on example embodiments. These embodiments are intended to be illustrative only, and it will be obvious to those skilled in the art that various modifications to the constituent elements and processes could be developed, and that such modifications are also within the scope of the present invention.
Claims (11)
[0001]
1. Input device (20, 22, 26, 28), comprising: a display device (14) placed with an information processing device (24, 30, 32) in a housing in an integrated manner; and a touch panel (12) covering the display device (14), the input device converting information on a point of contact with an index finger or thumb of a hand, detected by the touch panel, into an operation performed by a user and allowing the information processing device to process the information, wherein the display device displays a GUI (Graphical User Interface) screen image as an image superimposed on the screen over an output image generated as a result of information processing carried out by the information processing device, GUIs being arranged in the GUI screen image in correspondence with a position of an index finger or thumb of a user when the user holds the housing on both sides so that an index finger or thumb is placed on the upper surface of the display device, opposite the middle finger that supports the housing from the bottom; characterized by the fact that the GUI screen image includes a combined GUI area including a first GUI graphic and a second GUI graphic representing at least first and second GUI operation areas, respectively, wherein: (i) the second GUI graphic includes a directional key graphic corresponding to the second GUI operation area, which includes directional keys, in which a plurality of direction instruction keys are arranged on the circumference of a circle; (ii) the first GUI graphic includes a joystick graphic, which corresponds to the first GUI operation area, which includes a pseudo-joystick receiving input of an arbitrary direction and an arbitrary amount; and (iii) the joystick graphic is arranged at the center of the directional key graphic in the combined GUI area; wherein, when the user selects the first GUI graphic in the combined GUI area, the first GUI graphic representing the first GUI operation area is displayed in place of the combined GUI area; and when the user selects the second GUI graphic in the combined GUI area, the second GUI graphic representing the second GUI operation area is displayed in place of the combined GUI area.
[0002]
2. Input device (20, 22, 26, 28) according to claim 1, characterized by the fact that, among the GUIs, a GUI arranged at the position of an index finger has a circle sector shape that corresponds to the range of movement of the part above a joint of the index finger.
[0003]
3. Input device (20, 22, 26, 28) according to claim 2, characterized by the fact that the GUI arranged at the position of the index finger receives an individual input for each of a plurality of circle-sector-shaped areas formed by internally dividing the central angle of the circle sector.
[0004]
4. Input device (20, 22, 26, 28) according to claim 3, characterized by the fact that the plurality of circle-sector areas is formed so that the smaller the bending angle produced by the index finger operating an area, the smaller the central angle of that area.
[0005]
5. Input device (20, 22, 26, 28) according to claim 1, characterized by the fact that, among the GUIs, the GUI displayed at the position of an index finger is a GUI including a plurality of buttons arranged so as to form an arc that corresponds to the range of movement of the part above a joint of the index finger.
[0006]
6. Input device (20, 22, 26, 28) according to claim 5, characterized by the fact that the plurality of buttons is arranged so that the larger the bending angle produced by the index finger operating a button, the larger the spacing between the button and the adjacent button.
[0007]
7. Input device (20, 22, 26, 28) according to claim 1, characterized by the fact that circles corresponding to the respective circumferences are displayed in a partially translucent manner in the GUI for specifying a direction and in the GUI for operating buttons, respectively.
[0008]
8. Input device (20, 22, 26, 28) according to claim 1, characterized by the fact that the GUI for specifying a direction detects an operation performed on each direction instruction key by a detection area that surrounds the graphic symbol of the direction instruction key, and an area where the detection areas of adjacent direction instruction keys overlap each other is defined as a detection area for an operation in the direction intermediate between the two directions corresponding to the adjacent direction instruction keys.
[0009]
9. Information processing device (24, 30, 32), comprising: an information processing unit operative to perform information processing according to an operation performed by a user; a display device (14) operative to display an output image generated as a result of the information processing on the information processing device, the upper surface of the display device being covered by a touch panel; and a housing operative to store the information processing device and the display device in an integrated manner, wherein the display device displays a GUI (Graphical User Interface) screen image as an image superimposed on the output image, GUIs being arranged on the GUI screen image in correspondence with positions of an index finger and a thumb of a user when the user holds the housing on both sides so that an index finger and a thumb are placed on the surface of the display device, opposite the middle finger that supports the housing from the bottom, and the information processing unit converts information on a point of contact of an index finger or thumb of a hand on the GUI, which is detected by the touch panel, into an operation performed by a user and performs information processing; characterized by the fact that: the GUI screen image includes a combined GUI image area including a first GUI graphic and a second GUI graphic representing at least first and second GUI operation areas, respectively, wherein: (i) the second GUI graphic includes a directional key graphic corresponding to the second GUI operation area, which includes directional keys, in which a plurality of direction instruction keys are arranged on the circumference of a circle;
(ii) the first GUI graphic includes a joystick graphic, which corresponds to the first GUI operation area, which includes a pseudo joystick receiving an input of an arbitrary direction and an arbitrary amount; and (iii) the joystick graphic is arranged in the center of the directional key graphic in the combined GUI image area; wherein when the user selects the first GUI graphic in the combined GUI image area, the first GUI graphic representing the first GUI operation area is displayed in place of the combined GUI image area; and when the user selects the second GUI graphic in the combined GUI image area, the second GUI graphic representing the second GUI operation area is displayed in place of the combined GUI image area.
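The switching behavior of the combined GUI area can be sketched as a small state machine. This is not part of the patent text; the class name, the radius threshold, and the mode strings are assumptions for illustration, showing only that a touch near the center selects the joystick, a touch elsewhere selects the directional keys, and releasing the finger restores the combined image.

```python
class CombinedGUIArea:
    """Sketch of the screen switching described in the claims.

    The combined area shows a joystick graphic centered inside the
    directional key graphic.  Touching one of the two graphics expands the
    corresponding single operation area in place of the combined area;
    releasing the finger switches the screen image back.
    """
    JOYSTICK_RADIUS = 20  # assumed threshold: touches inside select the joystick

    def __init__(self):
        self.mode = "combined"

    def on_touch(self, dx, dy):
        """dx, dy: touch offset from the center of the combined area."""
        if dx * dx + dy * dy <= self.JOYSTICK_RADIUS ** 2:
            self.mode = "joystick"      # first GUI operation area
        else:
            self.mode = "directional"   # second GUI operation area
        return self.mode

    def on_release(self):
        self.mode = "combined"          # screen image switches back
        return self.mode
```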
[0010]
10. Input value acquisition method, comprising: displaying a GUI (Graphical User Interface) on a display device, which is housed with an information processing device in a housing in an integrated manner, as an image superimposed on an output image generated as a result of information processing; converting information on a point of contact of an index finger or thumb of a hand on the GUI, which is detected by a touch panel covering the display device, into an operation performed by a user; and allowing the information processing device to perform information processing according to the operation, wherein the displaying displays GUIs as a screen image by arranging the GUIs in correspondence with positions of an index finger and a thumb of a user when the user holds the housing on both sides of the housing so that an index finger and a thumb are placed on the upper surface of the display device, opposite the middle finger that supports the housing from the bottom; characterized by the fact that: the GUI screen image includes a combined GUI image area including a first GUI graphic and a second GUI graphic representing at least first and second GUI operation areas, respectively, wherein: (i) the second GUI graphic includes a directional key graphic corresponding to the second GUI operation area, which includes directional keys, in which a plurality of direction instruction keys are arranged on the circumference of a circle;
(ii) the first GUI graphic includes a joystick graphic, which corresponds to the first GUI operation area, which includes a pseudo joystick receiving an input of an arbitrary direction and an arbitrary amount; and (iii) the joystick graphic is arranged in the center of the directional key graphic in the combined GUI image area; wherein when the user selects the first GUI graphic in the combined GUI image area, the first GUI graphic representing the first GUI operation area is displayed in place of the combined GUI image area; and when the user selects the second GUI graphic in the combined GUI image area, the second GUI graphic representing the second GUI operation area is displayed in place of the combined GUI image area.
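The pseudo joystick's "input of an arbitrary direction and an arbitrary amount" reduces to converting the point of contact into an angle and a normalized magnitude. The sketch below is not from the patent; the function name, coordinate convention, and `max_radius` value are illustrative assumptions.

```python
import math

def pseudo_joystick_input(touch, center, max_radius=40.0):
    """Convert a touch point into (direction angle, amount in [0, 1]).

    The direction is the angle from the joystick center to the point of
    contact; the amount is the distance from the center, clamped to the
    operation area's radius.  max_radius is an assumed value.
    """
    dx = touch[0] - center[0]
    dy = touch[1] - center[1]
    angle = math.atan2(dy, dx)                        # direction in radians
    amount = min(math.hypot(dx, dy) / max_radius, 1.0)  # normalized tilt
    return angle, amount
```

For instance, a touch 40 pixels to the right of the center yields angle 0 and full amount 1.0, while a touch 20 pixels away yields amount 0.5.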
[0011]
11. Non-transitory computer-readable recording medium, containing computer-readable instructions which, when executed by a computer, cause the computer to perform actions, comprising: a module configured to display a GUI (Graphical User Interface) on a display device, which is placed with the computer in a housing in an integrated manner, as an image superimposed on an output image generated as a result of information processing; a module configured to convert information on a point of contact of an index finger or thumb of a hand on the GUI, which is detected by the touch panel covering the display device, into a GUI operation performed by a user; and a module configured to perform information processing according to the operation, wherein the module configured to display displays the GUIs by arranging the GUIs in correspondence with positions of an index finger and a thumb of a user when the user holds the housing on both sides so that an index finger and a thumb are placed on the upper surface of the display device, opposite the middle finger that supports the housing from the bottom; characterized by the fact that: the GUI screen image includes a combined GUI image area including a first GUI graphic and a second GUI graphic representing at least first and second GUI operation areas, respectively, wherein: (i) the second GUI graphic includes a directional key graphic corresponding to the second GUI operation area, which includes directional keys, in which a plurality of direction instruction keys are arranged on the circumference of a circle;
(ii) the first GUI graphic includes a joystick graphic, which corresponds to the first GUI operation area, which includes a pseudo joystick receiving an input of an arbitrary direction and an arbitrary amount; and (iii) the joystick graphic is arranged in the center of the directional key graphic in the combined GUI image area; wherein when the user selects the first GUI graphic in the combined GUI image area, the first GUI graphic representing the first GUI operation area is displayed in place of the combined GUI image area; and when the user selects the second GUI graphic in the combined GUI image area, the second GUI graphic representing the second GUI operation area is displayed in place of the combined GUI image area.
Similar technologies:
Publication number | Publication date | Patent title
BR102012002995B1|2020-09-15|INPUT DEVICE, INFORMATION PROCESSING DEVICE, INPUT VALUE ACQUISITION METHOD, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM
CA2765913C|2015-01-06|Method and apparatus for area-efficient graphical user interface
JP5363259B2|2013-12-11|Image display device, image display method, and program
US7844916B2|2010-11-30|Multimedia reproducing apparatus and menu screen display method
JP4741983B2|2011-08-10|Electronic device and method of operating electronic device
JP5460679B2|2014-04-02|Information processing apparatus, information processing method, and data structure of content file
JP2004295159A|2004-10-21|Icon display system and method, electronic equipment, and computer program
JP5647968B2|2015-01-07|Information processing apparatus and information processing method
WO2012066591A1|2012-05-24|Electronic apparatus, menu display method, content image display method, function execution method
JP5637668B2|2014-12-10|Information processing apparatus, information processing program, and information processing method
JP5295839B2|2013-09-18|Information processing apparatus, focus movement control method, and focus movement control program
JP2011036424A|2011-02-24|Game device, game control program and method
WO2013038605A1|2013-03-21|Information processing device, information processing method, content file data structure, gui placement simulator, and gui placement setting assistance method
JP5474669B2|2014-04-16|Terminal device
WO2011158701A1|2011-12-22|Terminal device
JP5570881B2|2014-08-13|Terminal device
JP2013012158A|2013-01-17|Electronic apparatus and control method
JP5669698B2|2015-02-12|GUI placement simulator and GUI placement setting support method
JP5382880B2|2014-01-08|GAME PROGRAM, GAME DEVICE, GAME CONTROL METHOD
JPH11203038A|1999-07-30|Portable terminal
WO2006100811A1|2006-09-28|Information processing device, image move instructing method, and information storage medium
JP2013061803A|2013-04-04|Information processing device, information processing method, and content file data structure
Family patents:
Publication number | Publication date
US20130031515A1|2013-01-31|
BR102012002995A2|2013-07-30|
RU2012104742A|2013-08-20|
RU2519059C2|2014-06-10|
US9122394B2|2015-09-01|
MX2012001547A|2012-08-30|
JP2012168931A|2012-09-06|
EP2487575A2|2012-08-15|
EP2487575B1|2020-06-24|
EP2487575A3|2015-12-09|
CN103150102A|2013-06-12|
JP5379250B2|2013-12-25|
Cited references:
Publication number | Application date | Publication date | Applicant | Patent title

JPH01149120A|1987-12-07|1989-06-12|Fuji Xerox Co Ltd|Information processor|
JPH06103018A|1992-09-21|1994-04-15|Toshiba Corp|Display system|
JPH07200243A|1993-12-29|1995-08-04|Casio Comput Co Ltd|Icon selection controller|
US6037937A|1997-12-04|2000-03-14|Nortel Networks Corporation|Navigation tool for graphical user interface|
US6278443B1|1998-04-30|2001-08-21|International Business Machines Corporation|Touch screen with random finger placement and rolling on screen to control the movement of information on-screen|
JP2000056877A|1998-08-07|2000-02-25|Nec Corp|Touch panel type layout free keyboard|
JP3872322B2|2001-09-17|2007-01-24|インターナショナル・ビジネス・マシーンズ・コーポレーション|Input method, input system and program corresponding to touch panel|
JP2004021522A|2002-06-14|2004-01-22|Sony Corp|Apparatus, method, and program for information processing|
JP4182769B2|2002-06-21|2008-11-19|アイシン・エィ・ダブリュ株式会社|Image display device and navigation device|
US20090143141A1|2002-08-06|2009-06-04|Igt|Intelligent Multiplayer Gaming System With Multi-Touch Display|
US8479122B2|2004-07-30|2013-07-02|Apple Inc.|Gestures for touch sensitive input devices|
JP2006053678A|2004-08-10|2006-02-23|Toshiba Corp|Electronic equipment with universal human interface|
JP2006127488A|2004-09-29|2006-05-18|Toshiba Corp|Input device, computer device, information processing method, and information processing program|
CN101133385B|2005-03-04|2014-05-07|苹果公司|Hand held electronic device, hand held device and operation method thereof|
JP4730962B2|2006-04-04|2011-07-20|Kddi株式会社|Mobile terminal with touch panel display and program|
JP2006279968A|2006-04-05|2006-10-12|Hitachi Ltd|Video access apparatus and recording medium with video access program recorded thereon|
JP2008021094A|2006-07-12|2008-01-31|Kyocera Mita Corp|Operating device and image forming device|
US7791594B2|2006-08-30|2010-09-07|Sony Ericsson Mobile Communications Ab|Orientation based multiple mode mechanically vibrated touch screen display|
RU2451981C2|2006-10-23|2012-05-27|Ей Джин ОХ|Input device|
KR101391689B1|2006-12-28|2014-05-07|삼성전자주식회사|Method for providing menu comprising movable menu-set and multimedia device thereof|
KR101349309B1|2007-03-14|2014-01-23|엘지전자 주식회사|Mobile communication device and control method thereof|
DE602007012211D1|2007-06-08|2011-03-10|Research In Motion Ltd|Haptic display for an electronic handheld device|
US9740386B2|2007-06-13|2017-08-22|Apple Inc.|Speed/positional mode translations|
US8180295B2|2007-07-19|2012-05-15|Sony Computer Entertainment Inc.|Bluetooth enabled computing system and associated methods|
WO2009014521A1|2007-07-26|2009-01-29|Razer Pte Ltd|Programmable touch sensitive controller|
US20090027330A1|2007-07-26|2009-01-29|Konami Gaming, Incorporated|Device for using virtual mouse and gaming machine|
US20090193363A1|2008-01-30|2009-07-30|International Business Machines Corporation|Representing Multiple Computing Resources Within A Predefined Region Of A Graphical User Interface For Displaying A Single Icon|
KR100946460B1|2008-09-30|2010-03-10|현대자동차주식회사|Input device of vehicle|
US8451236B2|2008-12-22|2013-05-28|Hewlett-Packard Development Company L.P.|Touch-sensitive display screen with absolute and relative input modes|
US20100293457A1|2009-05-15|2010-11-18|Gemstar Development Corporation|Systems and methods for alphanumeric navigation and input|
KR101578430B1|2009-07-13|2015-12-18|엘지전자 주식회사|Portable terminal|
JP4956644B2|2010-05-31|2012-06-20|株式会社東芝|Electronic device and input control method|
US9411509B2|2010-12-29|2016-08-09|Microsoft Technology Licensing, Llc|Virtual controller for touch display|
JP5237325B2|2010-04-28|2013-07-17|株式会社スクウェア・エニックス|Video game processing apparatus, video game processing method, and video game processing program|
US9423878B2|2011-01-06|2016-08-23|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9477311B2|2011-01-06|2016-10-25|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9471145B2|2011-01-06|2016-10-18|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
US9015641B2|2011-01-06|2015-04-21|Blackberry Limited|Electronic device and method of providing visual notification of a received communication|
US9465440B2|2011-01-06|2016-10-11|Blackberry Limited|Electronic device and method of displaying information in response to a gesture|
EP2487577A3|2011-02-11|2017-10-11|BlackBerry Limited|Presenting buttons for controlling an application|
US9766718B2|2011-02-28|2017-09-19|Blackberry Limited|Electronic device and method of displaying information in response to input|
JP5790203B2|2011-06-29|2015-10-07|ソニー株式会社|Information processing apparatus, information processing method, program, and remote operation system|
US9619038B2|2012-01-23|2017-04-11|Blackberry Limited|Electronic device and method of displaying a cover image and an application image from a low power condition|
JP6095393B2|2013-02-12|2017-03-15|株式会社スクウェア・エニックス|Video game apparatus and video game processing program|
JP6576245B2|2013-05-08|2019-09-18|株式会社スクウェア・エニックス・ホールディングス|Information processing apparatus, control method, and program|
AU2015241900B2|2014-04-04|2017-09-28|Colopl, Inc.|User interface program and game program|
USD771112S1|2014-06-01|2016-11-08|Apple Inc.|Display screen or portion thereof with graphical user interface|
JP5711409B1|2014-06-26|2015-04-30|ガンホー・オンライン・エンターテイメント株式会社|Terminal device|
KR102037481B1|2014-07-31|2019-10-28|삼성전자주식회사|Display apparatus, method of controlling the display apparatus and recordable medium storing for program for performing the method|
JP1518788S|2014-09-01|2015-03-09|
US9904463B2|2014-09-23|2018-02-27|Sulake Corporation Oy|Method and apparatus for controlling user character for playing game within virtual environment|
WO2016099317A1|2014-12-19|2016-06-23|Сергей Анатольевич ГОРИШНИЙ|Method and system for the visual management of data|
USD760746S1|2015-06-04|2016-07-05|Apple Inc.|Display screen or portion thereof with animated graphical user interface|
JP5981617B1|2015-08-20|2016-08-31|株式会社コロプラ|Computer program and computer-implemented method for displaying user interface images|
CN106730810B|2015-11-19|2020-02-18|网易(杭州)网络有限公司|Game button switching method and device of mobile intelligent terminal|
JP2017097674A|2015-11-25|2017-06-01|Necパーソナルコンピュータ株式会社|Information processing terminal and program|
CN105653189A|2015-12-28|2016-06-08|网宿科技股份有限公司|Method and system for virtualizing host handle on the basis of intelligent terminal|
KR20170122056A|2016-04-26|2017-11-03|삼성전자주식회사|Electronic device and method for inputting adaptive touch using display in the electronic device|
EP3285148A1|2016-08-19|2018-02-21|Bigben Interactive SA|Method for controlling a display element by a game console|
CN106975219B|2017-03-27|2019-02-12|网易(杭州)网络有限公司|Display control method and device, storage medium, the electronic equipment of game picture|
US10369470B2|2017-04-28|2019-08-06|PlayFusion Limited|User interface control cluster for enhancing a gaming experience|
CN108496151A|2017-05-23|2018-09-04|深圳市大疆创新科技有限公司|Method and apparatus for manipulating movable fixture|
CN108509141B|2018-03-30|2020-06-02|维沃移动通信有限公司|Control generation method and mobile terminal|
US11045719B2|2018-09-12|2021-06-29|King.Com Ltd.|Method and computer device for controlling a touch screen|
CN109343773B|2018-10-11|2021-07-09|广州要玩娱乐网络技术股份有限公司|Control method and device of portable touch equipment, storage medium and terminal|
USD902221S1|2019-02-01|2020-11-17|Apple Inc.|Electronic device with animated graphical user interface|
USD900871S1|2019-02-04|2020-11-03|Apple Inc.|Electronic device with animated graphical user interface|
JP6818091B2|2019-06-20|2021-01-20|株式会社コロプラ|Game programs, game methods, and information terminals|
US11216065B2|2019-09-26|2022-01-04|LenovoPte. Ltd.|Input control display based on eye gaze|
Legal status:
2013-07-30| B03A| Publication of a patent application or of a certificate of addition of invention [chapter 3.1 patent gazette]|
2018-12-18| B06F| Objections, documents and/or translations needed after an examination request according [chapter 6.6 patent gazette]|
2019-10-15| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2020-04-28| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2020-09-15| B16A| Patent or certificate of addition of invention granted|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 09/02/2012, SUBJECT TO THE LEGAL CONDITIONS. |
Priority:
Application number | Application date | Patent title
US201161441335P| true| 2011-02-10|2011-02-10|
US61/441335|2011-02-10|